Tags: machine learning

"Machine learning is a subset of artificial intelligence in the field of computer science that often uses statistical techniques to give computers the ability to "learn" (i.e., progressively improve performance on a specific task) with data, without being explicitly programmed.

https://en.wikipedia.org/wiki/Machine_learning

  1. A lightweight intelligent solution designed to monitor and identify abnormal vibration patterns in real time. The project details the hardware (XIAO ESP32-S3, Grove Shield, LIS3DHTR accelerometer) and software (SenseCraft AI platform) used to detect vibration anomalies. It explains the GEDAD algorithm used for learning normal vibration patterns and identifying deviations (a generic anomaly-detection sketch follows the list).
  2. A new study by MIT CSAIL researchers maps the challenges of AI in software development, identifying bottlenecks and highlighting research directions to move the field forward, aiming to allow humans to focus on high-level design while automating routine tasks.
  3. DeepMind introduces Ithaca, a deep neural network that can restore damaged ancient Greek inscriptions, identify their original location, and help establish their creation date, collaborating with historians to advance understanding of ancient history.
  4. Introducing Aeneas, the first AI model for contextualizing ancient inscriptions, designed to help historians better interpret, attribute, and restore fragmentary texts. It reasons across thousands of Latin inscriptions, retrieving textual and contextual parallels to aid in historical research.
  5. This blog post details the training of 'Chess Llama', a small Llama model designed to play chess. It covers the inspiration behind the project (Chess GPT), the dataset used (the Lichess Elite database), the training process using Hugging Face Transformers, and the model's performance (an Elo rating of 1350-1400). It also includes links to try the model and view the source code (a generation sketch follows the list).
  6. A detailed comparison of the architectures of recent large language models (LLMs) including DeepSeek-V3, OLMo 2, Gemma 3, Mistral Small 3.1, Llama 4, Qwen3, SmolLM3, and Kimi 2, focusing on key design choices and their impact on performance and efficiency.
  7. A deep dive into advanced evaluation for data scientists, discussing why accuracy is often misleading and exploring alternative metrics for classification and regression tasks, such as ROC-AUC, Log Loss, R², RMSLE, and Quantile Loss (usage of these metrics is sketched after the list).
  8. The article discusses using Large Language Model (LLM) embeddings as features in traditional machine learning models built with scikit-learn. It covers generating embeddings from text with models such as Sentence Transformers and combining them with existing features to improve model performance, including practical steps for loading data, creating embeddings, and integrating them into a scikit-learn pipeline for tasks like classification (see the sketch after the list).
  9. This page details the topic namers available in Turftopic, allowing automated assignment of human-readable names to topics. It covers Large Language Models (local and OpenAI), N-gram patterns, and provides API references for the `TopicNamer`, `LLMTopicNamer`, `OpenAITopicNamer`, and `NgramTopicNamer` classes.
  10. Python tutorial for reproducible labeling of cutting-edge topic models with GPT-4o mini. The article details training a FASTopic model and labeling its results with GPT-4o mini, emphasizing reproducibility and control over the labeling process (a labeling sketch follows the list).
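
Sketch for item 1: the bookmark does not spell out the GEDAD algorithm itself, so this is a generic stand-in that learns a "normal" vibration profile from accelerometer windows and flags windows that deviate too far; the function names, threshold, and synthetic data are illustrative assumptions, not the project's actual code.

```python
# Generic threshold-based stand-in (not GEDAD itself): learn a normal vibration
# profile from windows of x/y/z accelerometer samples, then flag outliers.
import numpy as np

def fit_normal_profile(windows: np.ndarray) -> dict:
    """windows: shape (n_windows, window_len, 3) of normal-operation samples."""
    # Per-window RMS magnitude as a simple summary feature.
    rms = np.sqrt((windows ** 2).sum(axis=2).mean(axis=1))
    return {"mean": rms.mean(), "std": rms.std()}

def is_anomalous(window: np.ndarray, profile: dict, k: float = 3.0) -> bool:
    rms = np.sqrt((window ** 2).sum(axis=1).mean())
    return abs(rms - profile["mean"]) > k * profile["std"]

# Synthetic data standing in for LIS3DHTR readings.
rng = np.random.default_rng(0)
normal = rng.normal(0.0, 0.05, size=(200, 64, 3))   # calm vibration
profile = fit_normal_profile(normal)
spike = rng.normal(0.0, 0.5, size=(64, 3))           # abnormal vibration
print(is_anomalous(spike, profile))                   # expected: True
```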
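
Sketch for item 5: querying a small causal language model for the next chess move with Hugging Face Transformers. The checkpoint name and the move-string prompt format are assumptions; the actual Chess Llama model may use a different identifier and input encoding.

```python
# Hedged example: ask a chess-tuned causal LM to continue a move sequence.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-username/chess-llama"  # hypothetical checkpoint name
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Assumption: the model was trained on plain SAN move sequences like this.
game_so_far = "1. e4 e5 2. Nf3 Nc6 3. Bb5"
inputs = tokenizer(game_so_far, return_tensors="pt")
out = model.generate(**inputs, max_new_tokens=8, do_sample=False)
new_tokens = out[0][inputs["input_ids"].shape[1]:]
print(tokenizer.decode(new_tokens, skip_special_tokens=True))
```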
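
Sketch for item 7: the scikit-learn calls for the metrics named in that summary, run on tiny synthetic data purely to show the function signatures.

```python
import numpy as np
from sklearn.metrics import (
    roc_auc_score, log_loss, r2_score, mean_squared_log_error, mean_pinball_loss
)

# Classification metrics need probabilistic predictions, not just hard labels.
y_true = np.array([0, 0, 1, 1])
y_prob = np.array([0.1, 0.4, 0.35, 0.8])
print("ROC-AUC :", roc_auc_score(y_true, y_prob))
print("Log loss:", log_loss(y_true, y_prob))

# Regression: R², RMSLE, and quantile (pinball) loss at the 0.9 quantile.
y_reg_true = np.array([3.0, 5.0, 2.5, 7.0])
y_reg_pred = np.array([2.5, 5.0, 4.0, 8.0])
print("R2      :", r2_score(y_reg_true, y_reg_pred))
print("RMSLE   :", np.sqrt(mean_squared_log_error(y_reg_true, y_reg_pred)))
print("Pinball :", mean_pinball_loss(y_reg_true, y_reg_pred, alpha=0.9))
```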
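
Sketch for item 8: generating sentence embeddings and stacking them with existing tabular features before fitting a standard scikit-learn classifier. The embedding model name, the example texts, and the numeric feature are placeholders.

```python
import numpy as np
from sentence_transformers import SentenceTransformer
from sklearn.linear_model import LogisticRegression

texts = [
    "great product, works as advertised",
    "terrible, broke after one day",
    "excellent value and fast shipping",
    "do not buy, total waste of money",
]
labels = [1, 0, 1, 0]
numeric_features = np.array([[5.0], [1.0], [4.0], [1.0]])  # e.g. an existing star-rating column

encoder = SentenceTransformer("all-MiniLM-L6-v2")   # any sentence-embedding model works
embeddings = encoder.encode(texts)                  # shape (n_samples, embedding_dim)

# Combine the embedding vectors with the existing features, then fit as usual.
X = np.hstack([embeddings, numeric_features])
clf = LogisticRegression(max_iter=1000).fit(X, labels)
print(clf.predict(X))
```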
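
Sketch for item 10: asking GPT-4o mini for a short, human-readable topic label with temperature 0 and a fixed seed for best-effort reproducibility. The top-word lists below stand in for what a fitted topic model such as FASTopic would produce, and the prompt wording is an assumption.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Placeholder output of a fitted topic model: topic id -> top words.
topic_top_words = {
    0: ["game", "team", "season", "player", "score"],
    1: ["stock", "market", "shares", "investor", "earnings"],
}

for topic_id, words in topic_top_words.items():
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        temperature=0,
        seed=42,  # best-effort determinism; the API does not guarantee it
        messages=[
            {"role": "system", "content": "You name topics with 2-4 word labels."},
            {"role": "user", "content": f"Top words: {', '.join(words)}. Label:"},
        ],
    )
    print(topic_id, response.choices[0].message.content.strip())
```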
